# 1.7T pretraining
Tanuki-8x8B-dpo-v1.0
Apache-2.0
Tanuki-8x8B is a large-scale language model pretrained from scratch and optimized for dialogue tasks through supervised fine-tuning (SFT) and direct preference optimization (DPO).
Large Language Model
Transformers
Supports Multiple Languages
weblab-GENIAC
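The DPO step mentioned in the description can be illustrated with a minimal sketch of the DPO loss for a single (chosen, rejected) response pair. This is a generic illustration, not the actual Tanuki training code, and all log-probability values below are made-up examples:

```python
import math

def dpo_loss(logp_w, logp_l, ref_logp_w, ref_logp_l, beta=0.1):
    """Direct Preference Optimization loss for one preference pair.

    logp_w / logp_l       : policy log-probs of the chosen / rejected response
    ref_logp_w / ref_logp_l: frozen reference-model log-probs of the same responses
    beta                  : strength of the KL-style penalty toward the reference
    """
    # Implicit reward margin of chosen over rejected, relative to the reference model
    margin = (logp_w - ref_logp_w) - (logp_l - ref_logp_l)
    # -log(sigmoid(beta * margin)): shrinks as the policy prefers the chosen response
    return -math.log(1.0 / (1.0 + math.exp(-beta * margin)))

# Made-up example: the policy already favors the chosen response (positive margin)
loss = dpo_loss(logp_w=-5.0, logp_l=-9.0, ref_logp_w=-6.0, ref_logp_l=-8.0)
print(round(loss, 4))
```

With a zero margin the loss reduces to log 2, and it decreases monotonically as the policy's preference for the chosen response grows past the reference model's.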
© 2025 AIbase